Opinion piece – David Williams, Director of Public Affairs at the Rank Group.
In a decade or so, when public policy students look back on the period leading up to the publication of the Government’s long-awaited, widely discussed and fiercely debated White Paper, they will be asked to examine the data that was provided, used and reported on as part of the process.
And they might be forgiven for wishing that they had chosen to study something else.
The issue of data, evidence and research in the gambling debate has been widely covered, but with what degree of accuracy, and with what confidence in the robustness of much of that data?
Recent months and, indeed, years have been marked by a blizzard of headline-grabbing statistics used to underline the urgent need for a so-called public health approach to the Government’s gambling legislation. Note, this is not confined to an approach to gambling-related harm, but rather an approach to gambling full stop. If that isn’t motivation enough for every stakeholder in the industry to sit up and take notice, I’m at a loss as to what it will take to get folk engaged.
In some quarters, a public health approach has long since been accepted as the only way to approach gambling reform. Others have been keen to establish whether authorities, journalists and commentators even understand what it means, and whether they agree with each other on that point. It is unrealistic to expect universal agreement on the merits or otherwise of a full-throated public health approach across the entire gambling landscape, but what cannot be overlooked is the dismal lack of rigour in the reporting of statistics and data, which has all too often gone unchallenged and has, to a large extent, shaped the starting point for subsequent discussions.
In both the House of Commons and, frequently and more sensationally, the House of Lords, wild statistics are presented as holy gospel: unchallenged, uninterrupted and allowed to pass into general acceptance by an uncritical public (or, worse still, by influential stakeholders). Numbers of suicides, costs to society, levels of underage gambling ‘addiction’… every aspect of the discussion is shot through with ‘evidence’ of an industry that is seemingly out of control. That evidence is then repeated and doubled down on across traditional media and in countless social media posts. Early in the New Year, the normally excellent Casino Guardian reported, unabashed, that “the number of gambling-related suicides in the UK is currently around 500 every year”. We have little idea if that is wrong, and next to no idea if it is right. But it gets published and repeated all the same.
For far too long the gambling industry has been asleep at the wheel, permitting the misuse of ‘evidence’ to shape the discussion. This is not about a “he said, she said” tit-for-tat exchange of numbers; it is about robust interrogation of the data that keeps being rolled out and which, each time it passes unchallenged, puts down deeper roots. Even a gentle push against some of the more recent, high-profile ‘evidence’ is enough to topple some public health cornerstones.
The Office for Health Improvement and Disparities (OHID), which has swallowed up Public Health England (PHE), has lately been busily revising and recalibrating PHE data. OHID was recently forced to publish a review of its dodgy dossier of 2021, a paper that has ultimately been discredited. In the original paper, the estimated figure for suicides as a direct result of gambling harms was pitched at 409 a year, a figure subsequently and widely repeated by media, ministers and industry critics. Similarly, the costs of gambling-related harm were pitched at £1.27 billion. No shortage of coverage ensued, but when the Gambling Commission subsequently, and privately, shared its view that “the reality is that reliable data does not exist”, it all went rather quiet.
The industry ought to have united in exposing the nonsense for precisely what it is. If that then makes for uncomfortable discussions in Whitehall or Westminster, awkward corrections in both Houses, and higher quality journalism, then so be it. We should not be afraid to call it out if doing so leads to an elevated level of debate and more balanced outcomes, grounded in more rigorous evidence.
Instead, only last week, we received some revised numbers from the same OHID authors who had reluctantly been forced to review their own work. Suicides are no longer thought to number 409 a year; they are now put in the range of 117 to 496. Costs are no longer pegged at £1.27 billion; they are now somewhere in the region of £1 billion to £1.8 billion. Can we now treat these revised ranges as gospel? Of course not. Regulus Partners recently published a meticulous dismantling of the report’s methodology, conclusions and oversights, which is well worth a read.
I am not for one moment overlooking the real and awful tragedy of suicide, nor am I diminishing the fairly obvious fact that gambling harm carries societal costs. What I am arguing is that, if we are going to talk about these incredibly important and sensitive issues, we must have confidence that the data we rely on can stand up to intense scrutiny. Sadly, that cannot be taken for granted, and where flimsy ‘evidence’ is put forward as fact, we must all question it.
When ranges are so wide, when methodologies are so flawed, when motivations are so conflicted and when the consequences are so severe for the entire industry and its customer base, the ultimate dereliction of duty is to shrug our shoulders and sigh. Rather, every single stakeholder in the industry ought to be motivated by a desire to see better use – and reporting – of sensitive and highly influential data.